Active Learning for Probability Estimation Using Jensen-Shannon Divergence
Abstract
Active selection of good training examples is an important approach to reducing data-collection costs in machine learning; however, most existing methods focus on maximizing classification accuracy. In many applications, such as those with unequal misclassification costs, producing good class probability estimates (CPEs) is more important than optimizing classification accuracy. We introduce novel approaches to active learning based on the algorithms BootstrapLV and ACTIVEDECORATE, by using Jensen-Shannon divergence (a similarity measure for probability distributions) to improve sample selection for optimizing CPEs. Comprehensive experimental results demonstrate the benefits of our approaches.
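The abstract does not spell out the selection rule, but the general idea can be sketched: score each unlabeled example by the Jensen-Shannon divergence among the class probability estimates produced by an ensemble, and query the examples on which the members disagree most. The following is a minimal Python sketch under that assumption; `ensemble` (a list of models exposing scikit-learn-style `predict_proba`) and the other names are hypothetical, not the paper's code.

```python
import numpy as np

def js_divergence(dists, eps=1e-12):
    """Generalized Jensen-Shannon divergence of the rows of `dists`
    with uniform weights: H(mean of rows) - mean of H(row)."""
    dists = np.asarray(dists, dtype=float)
    def entropy(p):
        p = np.clip(p, eps, 1.0)
        return -(p * np.log2(p)).sum(axis=-1)
    return entropy(dists.mean(axis=0)) - entropy(dists).mean()

def select_queries(ensemble, X_unlabeled, batch_size=10):
    """Rank unlabeled examples by ensemble disagreement on their
    class probability estimates and return the most contested batch."""
    # probs has shape (n_members, n_examples, n_classes).
    probs = np.stack([m.predict_proba(X_unlabeled) for m in ensemble])
    scores = np.array([js_divergence(probs[:, i, :])
                       for i in range(probs.shape[1])])
    return np.argsort(scores)[::-1][:batch_size]
```

The JS divergence is zero only when all members return identical CPEs, so high scores flag examples whose probability estimates are still unsettled.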
Similar Articles
Discrimination Measure of Correlations in a Population of Neurons by Using the Jensen-Shannon Divergence
The significance of synchronized spikes fired by nearby neurons for perception is still unclear. To evaluate how reliably one can decide whether a given population response to a sensory stimulus comes from the full joint distribution or from the product of independent distributions for each cell, we used recorded responses of pairs of single neurons in the primary visual cortex of macaque mon...
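The comparison the teaser describes, full joint distribution versus product of marginals, can be illustrated with the Jensen-Shannon divergence on synthetic spike counts; the data and names below are invented for illustration only.

```python
import numpy as np

def js_div(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions."""
    p, q = np.clip(p, eps, 1.0), np.clip(q, eps, 1.0)
    m = 0.5 * (p + q)
    kl = lambda a, b: (a * np.log2(a / b)).sum()
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Joint histogram of spike counts from a (hypothetical) pair of neurons.
rng = np.random.default_rng(0)
counts = rng.integers(0, 5, size=(1000, 2))   # fake paired responses
joint, _, _ = np.histogram2d(counts[:, 0], counts[:, 1], bins=5)
joint /= joint.sum()

# Product of marginals = "independent cells" model of the same pair.
indep = np.outer(joint.sum(axis=1), joint.sum(axis=0))

# Small divergence -> responses look independent; large -> correlated.
print(js_div(joint.ravel(), indep.ravel()))
```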
Jensen divergence based on Fisher's information
The measure of Jensen-Fisher divergence between probability distributions is introduced and its theoretical grounds are set out. This quantity, in contrast to the other Jensen divergences, is very sensitive to fluctuations of the probability distributions because it is controlled by the (local) Fisher information, which is a gradient functional of the distribution. So it is appropriate and ...
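The truncated teaser gives the idea but not the formula. Assuming the usual Jensen-type construction, J_F(P, Q) = (1/2)I(P) + (1/2)I(Q) - I((P+Q)/2) with I the Fisher information of a density (the paper's exact definition may differ), a discretized sketch looks like this:

```python
import numpy as np

def fisher_info(p, x):
    """Discretized Fisher information of a density p on grid x:
    the integral of p'(x)^2 / p(x)."""
    dp = np.gradient(p, x)
    return np.trapz(dp**2 / np.clip(p, 1e-12, None), x)

def jensen_fisher(p, q, x):
    """Jensen-type divergence built from Fisher information (assumed
    form); nonnegative because the Fisher functional is convex."""
    return (0.5 * fisher_info(p, x) + 0.5 * fisher_info(q, x)
            - fisher_info(0.5 * (p + q), x))

x = np.linspace(-5.0, 5.0, 2001)
gauss = lambda mu, s: np.exp(-(x - mu)**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))

# Even a small shift in the mean registers through the gradient term.
print(jensen_fisher(gauss(0.0, 1.0), gauss(0.5, 1.0), x))
```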
On Unified Generalizations of Relative Jensen-Shannon and Arithmetic-Geometric Divergence Measures, and Their Properties, by Pranesh Kumar and Inder Jeet Taneja
Abstract. In this paper we consider a one-parameter generalization of some non-symmetric divergence measures. The non-symmetric divergence measures considered are the Kullback-Leibler relative information, the χ²-divergence, the relative J-divergence, the relative Jensen-Shannon divergence, and the relative Arithmetic-Geometric divergence. All the generalizations considered can be written as particular case...
Bounds on Non-Symmetric Divergence Measures in Terms of Symmetric Divergence Measures
Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are the Kullback-Leibler [13] relative information and the Jeffreys [12] J-divergence. The Sibson [17] Jensen-Shannon divergence has also found applications in the literature. The author [20] studied new divergence measures based on arithmetic and geometric means...
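As a concrete contrast between the non-symmetric and symmetric measures named here, a small numerical check (the distributions are illustrative values only):

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])

kl = lambda a, b: (a * np.log(a / b)).sum()  # Kullback-Leibler relative information

print(kl(p, q), kl(q, p))           # unequal: KL is non-symmetric
print(kl(p, q) + kl(q, p))          # Jeffreys J-divergence, symmetric by construction

m = 0.5 * (p + q)                   # mixture distribution
print(0.5 * kl(p, m) + 0.5 * kl(q, m))  # Jensen-Shannon divergence, also symmetric
```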
Analysis of symbolic sequences using the Jensen-Shannon divergence.
We study statistical properties of the Jensen-Shannon divergence D, which quantifies the difference between probability distributions, and which has been widely applied to analyses of symbolic sequences. We present three interpretations of D in the framework of statistical physics, information theory, and mathematical statistics, and obtain approximations of the mean, the variance, and the prob...
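A minimal sketch of the kind of sequence comparison this abstract refers to: estimate the symbol distribution of each sequence over a shared alphabet and evaluate D between them. The sequences and alphabet below are invented examples.

```python
from collections import Counter
import numpy as np

def symbol_dist(seq, alphabet):
    """Empirical symbol distribution of a sequence over a fixed alphabet."""
    counts = Counter(seq)
    return np.array([counts[s] / len(seq) for s in alphabet])

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions."""
    p, q = np.clip(p, eps, 1.0), np.clip(q, eps, 1.0)
    m = 0.5 * (p + q)
    kl = lambda a, b: (a * np.log2(a / b)).sum()
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

alphabet = "ACGT"
print(js_divergence(symbol_dist("ACGTACGTAA", alphabet),
                    symbol_dist("GGGTTTCCCA", alphabet)))
```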
Publication year: 2005